A. Read ‘Signals.csv’ into a DataFrame and import the required libraries
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
import pandas as pd
import numpy as np
sd = pd.read_csv('/content/drive/MyDrive/AIML Course/Introduction to Neural Networks/Part-+1%2C2%263+-+Signal.csv')
sd.shape
(1599, 12)
sd.head()
| | Parameter 1 | Parameter 2 | Parameter 3 | Parameter 4 | Parameter 5 | Parameter 6 | Parameter 7 | Parameter 8 | Parameter 9 | Parameter 10 | Parameter 11 | Signal_Strength |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 7.4 | 0.70 | 0.00 | 1.9 | 0.076 | 11.0 | 34.0 | 0.9978 | 3.51 | 0.56 | 9.4 | 5 |
| 1 | 7.8 | 0.88 | 0.00 | 2.6 | 0.098 | 25.0 | 67.0 | 0.9968 | 3.20 | 0.68 | 9.8 | 5 |
| 2 | 7.8 | 0.76 | 0.04 | 2.3 | 0.092 | 15.0 | 54.0 | 0.9970 | 3.26 | 0.65 | 9.8 | 5 |
| 3 | 11.2 | 0.28 | 0.56 | 1.9 | 0.075 | 17.0 | 60.0 | 0.9980 | 3.16 | 0.58 | 9.8 | 6 |
| 4 | 7.4 | 0.70 | 0.00 | 1.9 | 0.076 | 11.0 | 34.0 | 0.9978 | 3.51 | 0.56 | 9.4 | 5 |
sd.tail()
| | Parameter 1 | Parameter 2 | Parameter 3 | Parameter 4 | Parameter 5 | Parameter 6 | Parameter 7 | Parameter 8 | Parameter 9 | Parameter 10 | Parameter 11 | Signal_Strength |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1594 | 6.2 | 0.600 | 0.08 | 2.0 | 0.090 | 32.0 | 44.0 | 0.99490 | 3.45 | 0.58 | 10.5 | 5 |
| 1595 | 5.9 | 0.550 | 0.10 | 2.2 | 0.062 | 39.0 | 51.0 | 0.99512 | 3.52 | 0.76 | 11.2 | 6 |
| 1596 | 6.3 | 0.510 | 0.13 | 2.3 | 0.076 | 29.0 | 40.0 | 0.99574 | 3.42 | 0.75 | 11.0 | 6 |
| 1597 | 5.9 | 0.645 | 0.12 | 2.0 | 0.075 | 32.0 | 44.0 | 0.99547 | 3.57 | 0.71 | 10.2 | 5 |
| 1598 | 6.0 | 0.310 | 0.47 | 3.6 | 0.067 | 18.0 | 42.0 | 0.99549 | 3.39 | 0.66 | 11.0 | 6 |
sd.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 1599 entries, 0 to 1598
Data columns (total 12 columns):
 #   Column           Non-Null Count  Dtype
---  ------           --------------  -----
 0   Parameter 1      1599 non-null   float64
 1   Parameter 2      1599 non-null   float64
 2   Parameter 3      1599 non-null   float64
 3   Parameter 4      1599 non-null   float64
 4   Parameter 5      1599 non-null   float64
 5   Parameter 6      1599 non-null   float64
 6   Parameter 7      1599 non-null   float64
 7   Parameter 8      1599 non-null   float64
 8   Parameter 9      1599 non-null   float64
 9   Parameter 10     1599 non-null   float64
 10  Parameter 11     1599 non-null   float64
 11  Signal_Strength  1599 non-null   int64
dtypes: float64(11), int64(1)
memory usage: 150.0 KB
import warnings
warnings.filterwarnings("ignore")

import random
import numpy as np
import pandas as pd
import seaborn as sns
import scipy.stats as stats
import matplotlib.pyplot as plt
import tensorflow as tf

# Seed every random number generator in play for reproducibility
random.seed(1)
np.random.seed(1)
tf.random.set_seed(1)

from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Activation, Dense
from tensorflow.keras.utils import to_categorical, custom_object_scope
from tensorflow.keras import regularizers, optimizers, initializers, backend

# Train/test split, feature scaling and evaluation utilities
from sklearn.model_selection import train_test_split, StratifiedKFold
from sklearn.preprocessing import MinMaxScaler, StandardScaler, LabelBinarizer
from sklearn.metrics import r2_score, classification_report, confusion_matrix

%matplotlib inline
B. Check for missing values and print percentage for each attribute
round(sd.isnull().sum() / sd.isnull().count() * 100, 2)
Parameter 1 0.0 Parameter 2 0.0 Parameter 3 0.0 Parameter 4 0.0 Parameter 5 0.0 Parameter 6 0.0 Parameter 7 0.0 Parameter 8 0.0 Parameter 9 0.0 Parameter 10 0.0 Parameter 11 0.0 Signal_Strength 0.0 dtype: float64
sd.isnull().values.any()
False
#No missing values found in the dataset
C. Check for presence of duplicate records in the dataset and impute with appropriate method
sd_duplicates = sd.duplicated()
print('Number of duplicates:', sd_duplicates.sum())
Number of duplicates: 240
from scipy.stats import skew
# calculating skewness of each column
sd_sk = sd.skew()
# print the skewness of each column
print(sd_sk)
Parameter 1 0.982751 Parameter 2 0.671593 Parameter 3 0.318337 Parameter 4 4.540655 Parameter 5 5.680347 Parameter 6 1.250567 Parameter 7 1.515531 Parameter 8 0.071288 Parameter 9 0.193683 Parameter 10 2.428672 Parameter 11 0.860829 Signal_Strength 0.217802 dtype: float64
# All columns are positively skewed, so the median is the appropriate imputation value;
# applying it here confirms it has no effect, since the dataset has no missing values
for column in sd.columns:
    sd[column].fillna(sd[column].median(), inplace=True)
# calculating skewness of each column
sd_sk = sd.skew()
# print the skewness of each column
print(sd_sk)
Parameter 1 0.982751 Parameter 2 0.671593 Parameter 3 0.318337 Parameter 4 4.540655 Parameter 5 5.680347 Parameter 6 1.250567 Parameter 7 1.515531 Parameter 8 0.071288 Parameter 9 0.193683 Parameter 10 2.428672 Parameter 11 0.860829 Signal_Strength 0.217802 dtype: float64
# Imputation left the skewness (and the duplicates) unchanged, as expected with no missing values, so drop the duplicate rows into a new dataframe
sd.duplicated().value_counts()
False 1359 True 240 dtype: int64
sd_dedup=sd.drop_duplicates()
sd_dedup.value_counts()
Parameter 1 Parameter 2 Parameter 3 Parameter 4 Parameter 5 Parameter 6 Parameter 7 Parameter 8 Parameter 9 Parameter 10 Parameter 11 Signal_Strength
4.6 0.52 0.15 2.1 0.054 8.0 65.0 0.99340 3.90 0.56 13.1 4 1
8.8 0.24 0.54 2.5 0.083 25.0 57.0 0.99830 3.39 0.54 9.2 5 1
0.41 0.64 2.2 0.093 9.0 42.0 0.99860 3.54 0.66 10.5 5 1
0.40 0.40 2.2 0.079 19.0 52.0 0.99800 3.44 0.64 9.2 5 1
0.37 0.48 2.1 0.097 39.0 145.0 0.99750 3.04 1.03 9.3 5 1
..
7.4 0.36 0.30 1.8 0.074 17.0 24.0 0.99419 3.24 0.70 11.4 8 1
0.29 2.6 0.087 26.0 72.0 0.99645 3.39 0.68 11.0 5 1
0.35 0.33 2.4 0.068 9.0 26.0 0.99470 3.36 0.60 11.9 6 1
0.29 0.38 1.7 0.062 9.0 30.0 0.99680 3.41 0.53 9.5 6 1
15.9 0.36 0.65 7.5 0.096 22.0 71.0 0.99760 2.98 0.84 14.9 5 1
Length: 1359, dtype: int64
sd_dedup.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 1359 entries, 0 to 1598
Data columns (total 12 columns):
 #   Column           Non-Null Count  Dtype
---  ------           --------------  -----
 0   Parameter 1      1359 non-null   float64
 1   Parameter 2      1359 non-null   float64
 2   Parameter 3      1359 non-null   float64
 3   Parameter 4      1359 non-null   float64
 4   Parameter 5      1359 non-null   float64
 5   Parameter 6      1359 non-null   float64
 6   Parameter 7      1359 non-null   float64
 7   Parameter 8      1359 non-null   float64
 8   Parameter 9      1359 non-null   float64
 9   Parameter 10     1359 non-null   float64
 10  Parameter 11     1359 non-null   float64
 11  Signal_Strength  1359 non-null   int64
dtypes: float64(11), int64(1)
memory usage: 138.0 KB
# y is the target variable, created from Signal_Strength in the dataframe
y = sd_dedup['Signal_Strength']
# Creating a histogram of the target variable
sns.histplot(y, kde=False, bins=10)
plt.title('Histogram of Target Variable')
plt.xlabel('Target Variable')
plt.ylabel('Count')
plt.show()
# Create a density plot of the target variable
sns.kdeplot(y)
plt.title('Density Plot of Target Variable')
plt.xlabel('Target Variable')
plt.ylabel('Density')
plt.show()
# Signal strength values 5 and 6 dominate the distribution, so the target classes are heavily imbalanced
# Create a pair plot of the predictor variables and the target variable
x=sd_dedup.iloc[:,:-1]
sns.pairplot(pd.concat([x, y], axis=1))
<seaborn.axisgrid.PairGrid at 0x7f5fc458d0a0>
#Parameter 1 & 8 are positively correlated
#Parameter 1 & 9 are negatively correlated
E. Share insights from the initial data analysis (at least 2).
# Compute the correlation matrix
sd_dedup_corr = sd_dedup.corr()
# Set the figure size
plt.subplots(figsize=(12, 8))
# Visualize the correlation matrix as a heatmap
sns.heatmap(sd_dedup_corr, cmap='RdBu', annot=True)
plt.title('Correlation Heatmap')
plt.show()
# Parameter 9 has the strongest negative correlation with Parameter 1 (about -0.69)
# Parameters 3 & 8 are positively correlated with Parameter 1
# Parameter 6 is also positively correlated with Parameter 7
# Both of these positive correlations peak at about 0.67
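The pairs called out from the heatmap can also be extracted programmatically. A minimal sketch (the helper `top_correlated_pairs` is illustrative, not part of the notebook, and is demonstrated on a toy frame rather than the signal data):

```python
import numpy as np
import pandas as pd

def top_correlated_pairs(df, n=3):
    """Return the n distinct column pairs with the strongest absolute correlation."""
    corr = df.corr()
    # Keep only the upper triangle, skipping self- and duplicate pairs
    mask = np.triu(np.ones(corr.shape, dtype=bool), k=1)
    pairs = corr.where(mask).stack()
    return pairs.reindex(pairs.abs().sort_values(ascending=False).index).head(n)

# Tiny illustrative frame (not the signal data)
demo = pd.DataFrame({'a': [1, 2, 3, 4], 'b': [2, 4, 6, 8], 'c': [4, 3, 1, 2]})
print(top_correlated_pairs(demo, n=1))
```

Calling it as `top_correlated_pairs(sd_dedup_corr_source_frame)` style on `sd_dedup` would list the same Parameter pairs noted above, ranked by absolute correlation.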
A. Split the data into X & Y.
x
| | Parameter 1 | Parameter 2 | Parameter 3 | Parameter 4 | Parameter 5 | Parameter 6 | Parameter 7 | Parameter 8 | Parameter 9 | Parameter 10 | Parameter 11 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 7.4 | 0.700 | 0.00 | 1.9 | 0.076 | 11.0 | 34.0 | 0.99780 | 3.51 | 0.56 | 9.4 |
| 1 | 7.8 | 0.880 | 0.00 | 2.6 | 0.098 | 25.0 | 67.0 | 0.99680 | 3.20 | 0.68 | 9.8 |
| 2 | 7.8 | 0.760 | 0.04 | 2.3 | 0.092 | 15.0 | 54.0 | 0.99700 | 3.26 | 0.65 | 9.8 |
| 3 | 11.2 | 0.280 | 0.56 | 1.9 | 0.075 | 17.0 | 60.0 | 0.99800 | 3.16 | 0.58 | 9.8 |
| 5 | 7.4 | 0.660 | 0.00 | 1.8 | 0.075 | 13.0 | 40.0 | 0.99780 | 3.51 | 0.56 | 9.4 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 1593 | 6.8 | 0.620 | 0.08 | 1.9 | 0.068 | 28.0 | 38.0 | 0.99651 | 3.42 | 0.82 | 9.5 |
| 1594 | 6.2 | 0.600 | 0.08 | 2.0 | 0.090 | 32.0 | 44.0 | 0.99490 | 3.45 | 0.58 | 10.5 |
| 1595 | 5.9 | 0.550 | 0.10 | 2.2 | 0.062 | 39.0 | 51.0 | 0.99512 | 3.52 | 0.76 | 11.2 |
| 1597 | 5.9 | 0.645 | 0.12 | 2.0 | 0.075 | 32.0 | 44.0 | 0.99547 | 3.57 | 0.71 | 10.2 |
| 1598 | 6.0 | 0.310 | 0.47 | 3.6 | 0.067 | 18.0 | 42.0 | 0.99549 | 3.39 | 0.66 | 11.0 |
1359 rows × 11 columns
y.value_counts()
5 577 6 535 7 167 4 53 8 17 3 10 Name: Signal_Strength, dtype: int64
x.shape
(1359, 11)
y.shape
(1359,)
# Converting y to categorical (one-hot encoding)
yc = to_categorical(y, num_classes=12)
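Since Signal_Strength only takes values 3–8, one-hot encoding with `num_classes=12` leaves six all-zero columns. A more compact alternative is to remap the raw labels to contiguous indices first; a small numpy sketch with hypothetical labels (not the real data):

```python
import numpy as np

# Hypothetical labels in the 3..8 range, like Signal_Strength
labels = np.array([5, 6, 5, 7, 3, 8, 4])

classes = np.unique(labels)             # [3 4 5 6 7 8]
idx = np.searchsorted(classes, labels)  # raw label -> contiguous class index
one_hot = np.eye(len(classes))[idx]     # shape (7, 6) instead of (7, 12)
```

With this encoding the output layer would need only `len(classes)` units, and predictions map back to raw labels via `classes[np.argmax(probs, axis=1)]`.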
B. Split the data into train & test with 70:30 proportion.
from sklearn.model_selection import train_test_split
# X represents the feature variables, y represents the target variable
x_train, x_test, y_train, y_test = train_test_split(x, yc, test_size=0.3, random_state=42)
C. Print shape of all the 4 variables and verify if train and test data is in sync.
# Print the shape of the four variables
print("x_train shape:", x_train.shape)
print("x_test shape:", x_test.shape)
print("y_train shape:", y_train.shape)
print("y_test shape:", y_test.shape)
# Verify if train and test data are in sync
assert x_train.shape[0] == y_train.shape[0], "Error: x_train and y_train have different number of samples"
assert x_test.shape[0] == y_test.shape[0], "Error: x_test and y_test have different number of samples"
x_train shape: (951, 11) x_test shape: (408, 11) y_train shape: (951, 12) y_test shape: (408, 12)
D. Normalise the train and test data with appropriate method
x_train.min()
Parameter 1 4.60000 Parameter 2 0.16000 Parameter 3 0.00000 Parameter 4 0.90000 Parameter 5 0.01200 Parameter 6 2.00000 Parameter 7 6.00000 Parameter 8 0.99007 Parameter 9 2.86000 Parameter 10 0.33000 Parameter 11 8.40000 dtype: float64
x_train.max()
Parameter 1 15.9000 Parameter 2 1.5800 Parameter 3 0.7800 Parameter 4 13.9000 Parameter 5 0.6110 Parameter 6 72.0000 Parameter 7 289.0000 Parameter 8 1.0032 Parameter 9 3.9000 Parameter 10 1.9800 Parameter 11 14.9000 dtype: float64
from sklearn.preprocessing import MinMaxScaler
# Create a scaler object
sd_scaler = MinMaxScaler()
# Fit the scaler object on the training data
sd_scaler.fit(x_train)
# Transform the training and testing data using the scaler object
x_train_norm = sd_scaler.transform(x_train)
x_test_norm = sd_scaler.transform(x_test)
x_train_norm.min()
0.0
x_train_norm.max()
1.0000000000000002
x_test_norm.min()
-0.11538461538461497
x_test_norm.max()
1.282051282051282
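The test extremes above (about −0.115 and 1.282) are expected: the scaler's min and max come from the training split only, so test values outside the training range map outside [0, 1]. A tiny numpy illustration of the same arithmetic (toy numbers, not the signal data):

```python
import numpy as np

# Toy 1-feature data: the scaling range comes from the training split only
train = np.array([[2.0], [4.0], [6.0]])
test = np.array([[1.0], [7.0]])   # values outside the training range

lo, hi = train.min(axis=0), train.max(axis=0)
train_scaled = (train - lo) / (hi - lo)
test_scaled = (test - lo) / (hi - lo)
print(test_scaled.ravel())  # [-0.25  1.25]: outside [0, 1], as with x_test_norm
```

This is the correct behaviour: fitting the scaler on the combined data would leak test-set information into preprocessing.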
E. Transform labels into a format acceptable to the Neural Network
print(y_train.shape)
print(y_test.shape)
(951, 12) (408, 12)
sd_cls_model = Sequential()
sd_cls_model.add(Dense(8, activation='tanh'))
sd_cls_model.add(Dense(10, activation='tanh'))
sd_cls_model.add(Dense(12, activation='sigmoid'))  # softmax is the usual choice for multi-class output with categorical crossentropy
# Compile the model
sd_cls_model.compile(loss="categorical_crossentropy", metrics=["accuracy"], optimizer="Adam")
# Fit the model (note: the raw features are used here; the normalized
# x_train_norm / x_test_norm computed above could be substituted)
history=sd_cls_model.fit(x_train, y_train, batch_size=20, epochs=50, validation_data=(x_test, y_test))
Epoch 1/50 48/48 [==============================] - 1s 7ms/step - loss: 2.3254 - accuracy: 0.0431 - val_loss: 1.9616 - val_accuracy: 0.4510 Epoch 2/50 48/48 [==============================] - 0s 4ms/step - loss: 1.7661 - accuracy: 0.4595 - val_loss: 1.5556 - val_accuracy: 0.4828 Epoch 3/50 48/48 [==============================] - 0s 4ms/step - loss: 1.4755 - accuracy: 0.4911 - val_loss: 1.3650 - val_accuracy: 0.5000 Epoch 4/50 48/48 [==============================] - 0s 4ms/step - loss: 1.3436 - accuracy: 0.4974 - val_loss: 1.2802 - val_accuracy: 0.4926 Epoch 5/50 48/48 [==============================] - 0s 3ms/step - loss: 1.2830 - accuracy: 0.4932 - val_loss: 1.2310 - val_accuracy: 0.5123 Epoch 6/50 48/48 [==============================] - 0s 3ms/step - loss: 1.2483 - accuracy: 0.5005 - val_loss: 1.2067 - val_accuracy: 0.5074 Epoch 7/50 48/48 [==============================] - 0s 4ms/step - loss: 1.2281 - accuracy: 0.4911 - val_loss: 1.1957 - val_accuracy: 0.4828 Epoch 8/50 48/48 [==============================] - 0s 4ms/step - loss: 1.2134 - accuracy: 0.4953 - val_loss: 1.1772 - val_accuracy: 0.5123 Epoch 9/50 48/48 [==============================] - 0s 3ms/step - loss: 1.2021 - accuracy: 0.4932 - val_loss: 1.1668 - val_accuracy: 0.5147 Epoch 10/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1944 - accuracy: 0.4953 - val_loss: 1.1629 - val_accuracy: 0.4951 Epoch 11/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1875 - accuracy: 0.4942 - val_loss: 1.1556 - val_accuracy: 0.5074 Epoch 12/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1819 - accuracy: 0.4984 - val_loss: 1.1494 - val_accuracy: 0.5098 Epoch 13/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1766 - accuracy: 0.4995 - val_loss: 1.1491 - val_accuracy: 0.4951 Epoch 14/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1718 - accuracy: 0.5058 - val_loss: 1.1441 - val_accuracy: 0.5025 Epoch 15/50 48/48 
[==============================] - 0s 3ms/step - loss: 1.1701 - accuracy: 0.4963 - val_loss: 1.1388 - val_accuracy: 0.5147 Epoch 16/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1659 - accuracy: 0.5005 - val_loss: 1.1400 - val_accuracy: 0.5025 Epoch 17/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1635 - accuracy: 0.5047 - val_loss: 1.1382 - val_accuracy: 0.5025 Epoch 18/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1613 - accuracy: 0.5026 - val_loss: 1.1371 - val_accuracy: 0.5049 Epoch 19/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1587 - accuracy: 0.5047 - val_loss: 1.1397 - val_accuracy: 0.4902 Epoch 20/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1568 - accuracy: 0.5026 - val_loss: 1.1337 - val_accuracy: 0.5049 Epoch 21/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1550 - accuracy: 0.5068 - val_loss: 1.1316 - val_accuracy: 0.5074 Epoch 22/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1528 - accuracy: 0.5058 - val_loss: 1.1317 - val_accuracy: 0.5049 Epoch 23/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1515 - accuracy: 0.5068 - val_loss: 1.1319 - val_accuracy: 0.5025 Epoch 24/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1499 - accuracy: 0.5037 - val_loss: 1.1307 - val_accuracy: 0.4877 Epoch 25/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1483 - accuracy: 0.5110 - val_loss: 1.1251 - val_accuracy: 0.5098 Epoch 26/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1488 - accuracy: 0.5058 - val_loss: 1.1286 - val_accuracy: 0.4853 Epoch 27/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1464 - accuracy: 0.5142 - val_loss: 1.1270 - val_accuracy: 0.5147 Epoch 28/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1435 - accuracy: 0.5037 - val_loss: 1.1324 - val_accuracy: 0.4853 Epoch 29/50 48/48 
[==============================] - 0s 4ms/step - loss: 1.1452 - accuracy: 0.5100 - val_loss: 1.1281 - val_accuracy: 0.4779 Epoch 30/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1430 - accuracy: 0.5131 - val_loss: 1.1268 - val_accuracy: 0.4902 Epoch 31/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1403 - accuracy: 0.5163 - val_loss: 1.1210 - val_accuracy: 0.5025 Epoch 32/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1393 - accuracy: 0.5121 - val_loss: 1.1245 - val_accuracy: 0.4877 Epoch 33/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1403 - accuracy: 0.5089 - val_loss: 1.1250 - val_accuracy: 0.4877 Epoch 34/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1384 - accuracy: 0.5110 - val_loss: 1.1173 - val_accuracy: 0.5098 Epoch 35/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1374 - accuracy: 0.5226 - val_loss: 1.1288 - val_accuracy: 0.4804 Epoch 36/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1363 - accuracy: 0.5131 - val_loss: 1.1208 - val_accuracy: 0.5025 Epoch 37/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1350 - accuracy: 0.5152 - val_loss: 1.1238 - val_accuracy: 0.4926 Epoch 38/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1347 - accuracy: 0.5121 - val_loss: 1.1161 - val_accuracy: 0.5123 Epoch 39/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1333 - accuracy: 0.5100 - val_loss: 1.1150 - val_accuracy: 0.5123 Epoch 40/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1319 - accuracy: 0.5247 - val_loss: 1.1199 - val_accuracy: 0.5025 Epoch 41/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1306 - accuracy: 0.5184 - val_loss: 1.1158 - val_accuracy: 0.5098 Epoch 42/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1290 - accuracy: 0.5226 - val_loss: 1.1149 - val_accuracy: 0.5098 Epoch 43/50 48/48 
[==============================] - 0s 4ms/step - loss: 1.1277 - accuracy: 0.5152 - val_loss: 1.1143 - val_accuracy: 0.5123 Epoch 44/50 48/48 [==============================] - 0s 3ms/step - loss: 1.1283 - accuracy: 0.5195 - val_loss: 1.1181 - val_accuracy: 0.4902 Epoch 45/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1269 - accuracy: 0.5163 - val_loss: 1.1102 - val_accuracy: 0.5098 Epoch 46/50 48/48 [==============================] - 0s 5ms/step - loss: 1.1254 - accuracy: 0.5342 - val_loss: 1.1190 - val_accuracy: 0.4853 Epoch 47/50 48/48 [==============================] - 0s 5ms/step - loss: 1.1264 - accuracy: 0.5226 - val_loss: 1.1172 - val_accuracy: 0.4951 Epoch 48/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1259 - accuracy: 0.5237 - val_loss: 1.1139 - val_accuracy: 0.4926 Epoch 49/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1224 - accuracy: 0.5384 - val_loss: 1.1142 - val_accuracy: 0.4926 Epoch 50/50 48/48 [==============================] - 0s 4ms/step - loss: 1.1216 - accuracy: 0.5258 - val_loss: 1.1088 - val_accuracy: 0.5147
i. Training Loss and Validation Loss
ii. Training Accuracy and Validation Accuracy
# Plot the training and validation loss
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Training Loss vs. Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend(['Training Loss', 'Validation Loss'])
plt.show()
# Plot the training and validation accuracy
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('Training Accuracy vs. Validation Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend(['Training Accuracy', 'Validation Accuracy'])
plt.show()
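To read the curves numerically rather than by eye, the best epoch can be pulled straight from `history.history`. A short mock dict stands in for the real history object below (the values are illustrative, not the notebook's actual logs):

```python
import numpy as np

# history.history is a dict of per-epoch lists; a short mock stands in here
mock_history = {'val_loss': [1.96, 1.36, 1.23, 1.11, 1.14],
                'val_accuracy': [0.45, 0.50, 0.51, 0.51, 0.49]}

best_epoch = int(np.argmin(mock_history['val_loss']))  # 0-indexed
print(f"best epoch: {best_epoch + 1}, "
      f"val_loss={mock_history['val_loss'][best_epoch]:.2f}, "
      f"val_accuracy={mock_history['val_accuracy'][best_epoch]:.2f}")
```

Applied to the real `history`, this identifies the epoch where validation loss bottoms out, which is also where early stopping would halt training.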
# Adding batch normalization and dropout after each dense layer except the output layer to improve model performance
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization
sd_cls_model_v1 = Sequential()
sd_cls_model_v1.add(Dense(128, activation='relu'))
sd_cls_model_v1.add(BatchNormalization())
sd_cls_model_v1.add(Dropout(0.2))
sd_cls_model_v1.add(Dense(64, activation='relu'))
sd_cls_model_v1.add(BatchNormalization())
sd_cls_model_v1.add(Dropout(0.2))
sd_cls_model_v1.add(Dense(32, activation='relu'))
sd_cls_model_v1.add(BatchNormalization())
sd_cls_model_v1.add(Dropout(0.2))
sd_cls_model_v1.add(Dense(16, activation='relu'))
sd_cls_model_v1.add(BatchNormalization())
sd_cls_model_v1.add(Dropout(0.2))
sd_cls_model_v1.add(Dense(12, activation='softmax'))
# Compile the model
sd_cls_model_v1.compile(loss="categorical_crossentropy", metrics=["accuracy"], optimizer="sgd")
# Fit the model
history_v1=sd_cls_model_v1.fit(x_train, y_train, batch_size=24, epochs=100, validation_data=(x_test, y_test))
Epoch 1/100 40/40 [==============================] - 2s 12ms/step - loss: 2.9542 - accuracy: 0.0715 - val_loss: 2.4001 - val_accuracy: 0.0196 Epoch 2/100 40/40 [==============================] - 0s 5ms/step - loss: 2.5906 - accuracy: 0.1272 - val_loss: 2.1936 - val_accuracy: 0.3775 Epoch 3/100 40/40 [==============================] - 0s 5ms/step - loss: 2.3475 - accuracy: 0.2156 - val_loss: 2.0636 - val_accuracy: 0.4583 Epoch 4/100 40/40 [==============================] - 0s 4ms/step - loss: 2.0819 - accuracy: 0.3239 - val_loss: 1.8929 - val_accuracy: 0.4828 Epoch 5/100 40/40 [==============================] - 0s 5ms/step - loss: 1.9669 - accuracy: 0.3586 - val_loss: 1.6899 - val_accuracy: 0.5147 Epoch 6/100 40/40 [==============================] - 0s 5ms/step - loss: 1.8525 - accuracy: 0.4206 - val_loss: 1.6652 - val_accuracy: 0.5196 Epoch 7/100 40/40 [==============================] - 0s 5ms/step - loss: 1.7433 - accuracy: 0.4479 - val_loss: 1.5482 - val_accuracy: 0.5098 Epoch 8/100 40/40 [==============================] - 0s 6ms/step - loss: 1.6325 - accuracy: 0.4448 - val_loss: 1.4497 - val_accuracy: 0.5221 Epoch 9/100 40/40 [==============================] - 0s 5ms/step - loss: 1.5478 - accuracy: 0.4606 - val_loss: 1.4078 - val_accuracy: 0.4926 Epoch 10/100 40/40 [==============================] - 0s 4ms/step - loss: 1.4911 - accuracy: 0.4932 - val_loss: 1.3965 - val_accuracy: 0.4926 Epoch 11/100 40/40 [==============================] - 0s 5ms/step - loss: 1.4315 - accuracy: 0.4953 - val_loss: 1.4401 - val_accuracy: 0.4902 Epoch 12/100 40/40 [==============================] - 0s 5ms/step - loss: 1.4193 - accuracy: 0.4648 - val_loss: 1.3582 - val_accuracy: 0.5074 Epoch 13/100 40/40 [==============================] - 0s 5ms/step - loss: 1.3744 - accuracy: 0.4890 - val_loss: 1.2668 - val_accuracy: 0.5074 Epoch 14/100 40/40 [==============================] - 0s 5ms/step - loss: 1.3239 - accuracy: 0.4942 - val_loss: 1.2858 - val_accuracy: 0.4902 Epoch 15/100 40/40 
[==============================] - 0s 5ms/step - loss: 1.3467 - accuracy: 0.5005 - val_loss: 1.3083 - val_accuracy: 0.5000 Epoch 16/100 40/40 [==============================] - 0s 5ms/step - loss: 1.3191 - accuracy: 0.4911 - val_loss: 1.3005 - val_accuracy: 0.5074 Epoch 17/100 40/40 [==============================] - 0s 5ms/step - loss: 1.3247 - accuracy: 0.4911 - val_loss: 1.2372 - val_accuracy: 0.4853 Epoch 18/100 40/40 [==============================] - 0s 5ms/step - loss: 1.2796 - accuracy: 0.4795 - val_loss: 1.2122 - val_accuracy: 0.5098 Epoch 19/100 40/40 [==============================] - 0s 5ms/step - loss: 1.2749 - accuracy: 0.5026 - val_loss: 1.1874 - val_accuracy: 0.5245 Epoch 20/100 40/40 [==============================] - 0s 4ms/step - loss: 1.2599 - accuracy: 0.5226 - val_loss: 1.2069 - val_accuracy: 0.4926 Epoch 21/100 40/40 [==============================] - 0s 4ms/step - loss: 1.2318 - accuracy: 0.5216 - val_loss: 1.2615 - val_accuracy: 0.5025 Epoch 22/100 40/40 [==============================] - 0s 4ms/step - loss: 1.2465 - accuracy: 0.5068 - val_loss: 1.1652 - val_accuracy: 0.5221 Epoch 23/100 40/40 [==============================] - 0s 4ms/step - loss: 1.2584 - accuracy: 0.5195 - val_loss: 1.1725 - val_accuracy: 0.5466 Epoch 24/100 40/40 [==============================] - 0s 5ms/step - loss: 1.2153 - accuracy: 0.5237 - val_loss: 1.1956 - val_accuracy: 0.5147 Epoch 25/100 40/40 [==============================] - 0s 4ms/step - loss: 1.2063 - accuracy: 0.5184 - val_loss: 1.1510 - val_accuracy: 0.5392 Epoch 26/100 40/40 [==============================] - 0s 4ms/step - loss: 1.2177 - accuracy: 0.5216 - val_loss: 1.1598 - val_accuracy: 0.5172 Epoch 27/100 40/40 [==============================] - 0s 5ms/step - loss: 1.2218 - accuracy: 0.5300 - val_loss: 1.1565 - val_accuracy: 0.5049 Epoch 28/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1929 - accuracy: 0.5195 - val_loss: 1.2180 - val_accuracy: 0.5000 Epoch 29/100 40/40 
[==============================] - 0s 7ms/step - loss: 1.2043 - accuracy: 0.5184 - val_loss: 1.1328 - val_accuracy: 0.5196 Epoch 30/100 40/40 [==============================] - 0s 7ms/step - loss: 1.1699 - accuracy: 0.5342 - val_loss: 1.1893 - val_accuracy: 0.5147 Epoch 31/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1917 - accuracy: 0.5163 - val_loss: 1.1102 - val_accuracy: 0.5196 Epoch 32/100 40/40 [==============================] - 0s 7ms/step - loss: 1.1602 - accuracy: 0.5152 - val_loss: 1.1265 - val_accuracy: 0.5196 Epoch 33/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1859 - accuracy: 0.5216 - val_loss: 1.1217 - val_accuracy: 0.5490 Epoch 34/100 40/40 [==============================] - 0s 7ms/step - loss: 1.2005 - accuracy: 0.5100 - val_loss: 1.1059 - val_accuracy: 0.5172 Epoch 35/100 40/40 [==============================] - 0s 7ms/step - loss: 1.1654 - accuracy: 0.5289 - val_loss: 1.1183 - val_accuracy: 0.5147 Epoch 36/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1543 - accuracy: 0.5489 - val_loss: 1.0972 - val_accuracy: 0.5221 Epoch 37/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1396 - accuracy: 0.5310 - val_loss: 1.2657 - val_accuracy: 0.4461 Epoch 38/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1633 - accuracy: 0.5300 - val_loss: 1.0875 - val_accuracy: 0.5686 Epoch 39/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1507 - accuracy: 0.5394 - val_loss: 1.0893 - val_accuracy: 0.5613 Epoch 40/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1573 - accuracy: 0.5342 - val_loss: 1.0785 - val_accuracy: 0.5686 Epoch 41/100 40/40 [==============================] - 0s 7ms/step - loss: 1.1507 - accuracy: 0.5363 - val_loss: 1.1163 - val_accuracy: 0.5515 Epoch 42/100 40/40 [==============================] - 0s 6ms/step - loss: 1.1759 - accuracy: 0.5279 - val_loss: 1.0873 - val_accuracy: 0.5490 Epoch 43/100 40/40 
[==============================] - 0s 7ms/step - loss: 1.1176 - accuracy: 0.5447 - val_loss: 1.0968 - val_accuracy: 0.5417 Epoch 44/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1290 - accuracy: 0.5384 - val_loss: 1.2083 - val_accuracy: 0.4730 Epoch 45/100 40/40 [==============================] - 0s 4ms/step - loss: 1.1450 - accuracy: 0.5289 - val_loss: 1.1470 - val_accuracy: 0.5123 Epoch 46/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1288 - accuracy: 0.5468 - val_loss: 1.1660 - val_accuracy: 0.4853 Epoch 47/100 40/40 [==============================] - 0s 4ms/step - loss: 1.1407 - accuracy: 0.5405 - val_loss: 1.2065 - val_accuracy: 0.4608 Epoch 48/100 40/40 [==============================] - 0s 4ms/step - loss: 1.1166 - accuracy: 0.5426 - val_loss: 1.0894 - val_accuracy: 0.5294 Epoch 49/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1349 - accuracy: 0.5394 - val_loss: 1.1433 - val_accuracy: 0.4926 Epoch 50/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1296 - accuracy: 0.5363 - val_loss: 1.0694 - val_accuracy: 0.5441 Epoch 51/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1345 - accuracy: 0.5363 - val_loss: 1.0658 - val_accuracy: 0.5686 Epoch 52/100 40/40 [==============================] - 0s 4ms/step - loss: 1.1208 - accuracy: 0.5563 - val_loss: 1.1554 - val_accuracy: 0.4779 Epoch 53/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1334 - accuracy: 0.5394 - val_loss: 1.0845 - val_accuracy: 0.5441 Epoch 54/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1161 - accuracy: 0.5373 - val_loss: 1.0601 - val_accuracy: 0.5686 Epoch 55/100 40/40 [==============================] - 0s 5ms/step - loss: 1.1241 - accuracy: 0.5384 - val_loss: 1.0786 - val_accuracy: 0.5490 Epoch 56/100 40/40 [==============================] - 0s 4ms/step - loss: 1.1193 - accuracy: 0.5573 - val_loss: 1.0497 - val_accuracy: 0.5711 Epoch 57/100 40/40 
[==============================] - 0s 4ms/step - loss: 1.1066 - accuracy: 0.5689 - val_loss: 1.0461 - val_accuracy: 0.5711 ... (per-epoch output for epochs 58-99 omitted; training and validation accuracy hover around 0.53-0.58 throughout) ... Epoch 100/100 40/40 [==============================] - 0s 7ms/step - loss: 1.0869 - accuracy: 0.5626 - val_loss: 1.1001 - val_accuracy: 0.5123
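The validation loss above oscillates without sustained improvement over the later epochs, which is the situation early stopping is designed for. Keras provides an `EarlyStopping` callback for this; as a framework-free illustration (not the notebook's actual code), the core patience logic can be sketched as:

```python
def early_stop_epoch(val_losses, patience=10, min_delta=0.0):
    """Return the 0-based epoch at which training would stop,
    or None if the patience budget is never exhausted."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:  # improvement: reset the counter
            best = loss
            wait = 0
        else:                        # no improvement: spend patience
            wait += 1
            if wait >= patience:
                return epoch
    return None
```

In Keras the equivalent would be passing `keras.callbacks.EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)` to `model.fit(..., callbacks=[...])`.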
# Plot the training and validation loss
plt.plot(history_v1.history['loss'])
plt.plot(history_v1.history['val_loss'])
plt.title('Training Loss vs. Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend(['Training Loss', 'Validation Loss'])
plt.show()
# Plot the training and validation accuracy
plt.plot(history_v1.history['accuracy'])
plt.plot(history_v1.history['val_accuracy'])
plt.title('Training Accuracy vs. Validation Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend(['Training Accuracy', 'Validation Accuracy'])
plt.show()
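Beyond eyeballing the curves, the best epoch can be read off the history object programmatically. A small helper, assuming a Keras-style `history.history` dict of per-epoch lists (the helper name is illustrative, not from the notebook):

```python
def summarize_history(history_dict):
    """Pick the epoch with the lowest validation loss and report
    the metrics recorded at that epoch (epochs are 1-based)."""
    val_loss = history_dict["val_loss"]
    best = min(range(len(val_loss)), key=val_loss.__getitem__)
    return {
        "best_epoch": best + 1,
        "best_val_loss": val_loss[best],
        "val_accuracy_at_best": history_dict["val_accuracy"][best],
    }
```

For the run above this would be called as `summarize_history(history_v1.history)`.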
# The above model provided slightly better accuracy.
# Next, add more (but narrower) dense layers to try to improve accuracy further
sd_cls_model_v2 = Sequential()
sd_cls_model_v2.add(Dense(32, activation='relu'))
sd_cls_model_v2.add(BatchNormalization())
sd_cls_model_v2.add(Dropout(0.2))
sd_cls_model_v2.add(Dense(28, activation='relu'))
sd_cls_model_v2.add(BatchNormalization())
sd_cls_model_v2.add(Dropout(0.2))
sd_cls_model_v2.add(Dense(24, activation='relu'))
sd_cls_model_v2.add(BatchNormalization())
sd_cls_model_v2.add(Dropout(0.2))
sd_cls_model_v2.add(Dense(12, activation='sigmoid'))  # note: softmax is the conventional output activation for multi-class classification
# Compile the model
sd_cls_model_v2.compile(loss="categorical_crossentropy", metrics=["accuracy"], optimizer="Adam")
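As a sanity check on the size of the stack above, its trainable parameter count can be computed by hand, assuming the dataset's 11 input features: each `Dense` layer has `in * out` weights plus `out` biases, and each `BatchNormalization` layer contributes `2 * out` trainable parameters (gamma and beta). A sketch of that arithmetic (the function is illustrative, not from the notebook):

```python
def dense_bn_params(layer_sizes, n_inputs):
    """Trainable parameters for alternating Dense+BatchNorm blocks
    followed by a plain Dense output layer."""
    total, fan_in = 0, n_inputs
    *hidden, out = layer_sizes
    for units in hidden:
        total += fan_in * units + units  # Dense weights + biases
        total += 2 * units               # BatchNorm gamma + beta
        fan_in = units
    total += fan_in * out + out          # output Dense layer
    return total

print(dense_bn_params([32, 28, 24, 12], n_inputs=11))  # → 2472
```

This gives 2472 trainable parameters; `BatchNormalization`'s moving mean and variance are non-trainable and excluded (and `model.summary()` would report them separately).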
Increasing the number of epochs from 100 to 250 to improve accuracy further
# Fit the model
history_v2=sd_cls_model_v2.fit(x_train, y_train, batch_size=24, epochs=250, validation_data=(x_test, y_test))
Epoch 1/250 40/40 [==============================] - 17s 11ms/step - loss: 2.9911 - accuracy: 0.0736 - val_loss: 2.5458 - val_accuracy: 0.0466 ... (per-epoch output for epochs 2-223 omitted; validation accuracy climbs to roughly 0.60-0.62 and then plateaus) ... Epoch 224/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9779 - accuracy: 0.5668
- val_loss: 1.0050 - val_accuracy: 0.5931 Epoch 225/250 40/40 [==============================] - 0s 7ms/step - loss: 0.9779 - accuracy: 0.5741 - val_loss: 0.9747 - val_accuracy: 0.6005 Epoch 226/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9906 - accuracy: 0.5825 - val_loss: 0.9869 - val_accuracy: 0.6127 Epoch 227/250 40/40 [==============================] - 0s 7ms/step - loss: 0.9919 - accuracy: 0.5573 - val_loss: 1.0006 - val_accuracy: 0.6054 Epoch 228/250 40/40 [==============================] - 0s 7ms/step - loss: 0.9585 - accuracy: 0.5878 - val_loss: 1.0134 - val_accuracy: 0.5956 Epoch 229/250 40/40 [==============================] - 0s 8ms/step - loss: 0.9810 - accuracy: 0.5699 - val_loss: 0.9926 - val_accuracy: 0.5931 Epoch 230/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9790 - accuracy: 0.5752 - val_loss: 0.9436 - val_accuracy: 0.6054 Epoch 231/250 40/40 [==============================] - 0s 7ms/step - loss: 0.9791 - accuracy: 0.5689 - val_loss: 0.9480 - val_accuracy: 0.5931 Epoch 232/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9646 - accuracy: 0.5846 - val_loss: 0.9613 - val_accuracy: 0.6005 Epoch 233/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9763 - accuracy: 0.5773 - val_loss: 0.9562 - val_accuracy: 0.5907 Epoch 234/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9456 - accuracy: 0.6078 - val_loss: 0.9881 - val_accuracy: 0.6005 Epoch 235/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9637 - accuracy: 0.5868 - val_loss: 0.9548 - val_accuracy: 0.5907 Epoch 236/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9683 - accuracy: 0.5783 - val_loss: 0.9951 - val_accuracy: 0.6005 Epoch 237/250 40/40 [==============================] - 0s 4ms/step - loss: 0.9811 - accuracy: 0.5647 - val_loss: 0.9704 - val_accuracy: 0.6103 Epoch 238/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9662 - accuracy: 
0.5794 - val_loss: 0.9770 - val_accuracy: 0.5931 Epoch 239/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9748 - accuracy: 0.5720 - val_loss: 1.0041 - val_accuracy: 0.6054 Epoch 240/250 40/40 [==============================] - 0s 5ms/step - loss: 1.0033 - accuracy: 0.5573 - val_loss: 1.0230 - val_accuracy: 0.5882 Epoch 241/250 40/40 [==============================] - 0s 4ms/step - loss: 0.9731 - accuracy: 0.5699 - val_loss: 0.9723 - val_accuracy: 0.6029 Epoch 242/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9656 - accuracy: 0.5594 - val_loss: 0.9781 - val_accuracy: 0.6029 Epoch 243/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9538 - accuracy: 0.5952 - val_loss: 0.9697 - val_accuracy: 0.5980 Epoch 244/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9684 - accuracy: 0.5846 - val_loss: 1.0026 - val_accuracy: 0.6176 Epoch 245/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9739 - accuracy: 0.5720 - val_loss: 0.9639 - val_accuracy: 0.5956 Epoch 246/250 40/40 [==============================] - 0s 4ms/step - loss: 0.9678 - accuracy: 0.5605 - val_loss: 0.9750 - val_accuracy: 0.6127 Epoch 247/250 40/40 [==============================] - 0s 6ms/step - loss: 0.9781 - accuracy: 0.5499 - val_loss: 0.9658 - val_accuracy: 0.6005 Epoch 248/250 40/40 [==============================] - 0s 5ms/step - loss: 0.9684 - accuracy: 0.5783 - val_loss: 0.9732 - val_accuracy: 0.6029 Epoch 249/250 40/40 [==============================] - 0s 4ms/step - loss: 1.0039 - accuracy: 0.5584 - val_loss: 1.0509 - val_accuracy: 0.5956 Epoch 250/250 40/40 [==============================] - 0s 4ms/step - loss: 0.9714 - accuracy: 0.5857 - val_loss: 0.9779 - val_accuracy: 0.6103
# Plot the training and validation loss
plt.plot(history_v2.history['loss'])
plt.plot(history_v2.history['val_loss'])
plt.title('Training Loss vs. Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend(['Training Loss', 'Validation Loss'])
plt.show()
# Plot the training and validation accuracy
plt.plot(history_v2.history['accuracy'])
plt.plot(history_v2.history['val_accuracy'])
plt.title('Training Accuracy vs. Validation Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend(['Training Accuracy', 'Validation Accuracy'])
plt.show()
In the first model (sd_cls_model), 2 dense layers with tanh activation were used, with sigmoid in the output layer and Adam as the optimizer. The accuracy hovered around 50%.
In the next model (sd_cls_model_v1), 4 dense layers with the relu activation function were used, with softmax in the output layer and SGD as the optimizer. The dropout layers introduced noise during training, and the accuracy improved only marginally over the previous result.
In the final model (sd_cls_model_v2), 3 dense layers with relu activation were used, with a sigmoid output layer and the Adam optimizer. The number of epochs was increased from 100 to 250. This produced a better accuracy of about 60% compared to the models above.
A. Read the .h5 file and assign to a variable
import io
import h5py
path = '/content/drive/My Drive/AIML Course/Introduction to Neural Networks/'
# Open the file as readonly
h5f = h5py.File(path + 'Autonomous_Vehicles_SVHN_single_grey1.h5', 'r')
B. Print all the keys from the .h5 file
h5f.keys()
<KeysViewHDF5 ['X_test', 'X_train', 'X_val', 'y_test', 'y_train', 'y_val']>
C. Split the data into X_train, X_test, Y_train, Y_test
X_train = h5f['X_train'][:]
y_train = h5f['y_train'][:]
X_test = h5f['X_test'][:]
y_test = h5f['y_test'][:]
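Since the file is opened read-only and each dataset is copied into memory with `[:]`, the handle can safely be closed afterwards; a context manager keeps this tidy. A minimal sketch of the same read pattern, using a throwaway temporary file with hypothetical dataset names (assumes `h5py` is installed):

```python
import os
import tempfile
import numpy as np
import h5py

# Create a small throwaway .h5 file just to demonstrate the pattern
tmp_path = os.path.join(tempfile.mkdtemp(), 'demo.h5')
with h5py.File(tmp_path, 'w') as f:
    f.create_dataset('X_train', data=np.zeros((5, 32, 32)))
    f.create_dataset('y_train', data=np.arange(5))

# Open read-only; the 'with' block closes the handle automatically
with h5py.File(tmp_path, 'r') as f:
    X = f['X_train'][:]   # [:] copies the HDF5 dataset into a numpy array
    y = f['y_train'][:]

print(X.shape, y.shape)
```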
A. Print the shape of all 4 data splits (x, y, train, test) to verify that x & y are in sync
print('X_train' ,X_train.shape)
print('y_train' ,y_train.shape)
print('X_test' ,X_test.shape)
print('y_test' ,y_test.shape)
X_train (42000, 32, 32) y_train (42000,) X_test (18000, 32, 32) y_test (18000,)
B. Visualise first 10 images in train data and print its corresponding labels
%matplotlib inline
import matplotlib.pyplot as plt

# Note: with columns = rows = 10, this actually draws a 10x10 grid
# (100 images), starting from index 1 of the training data
columns = 10
rows = 10
fig = plt.figure(figsize=(12, 8))
for i in range(1, columns * rows + 1):
    img = X_train[i]
    fig.add_subplot(rows, columns, i)
    print(y_train[i], end=' ')
    if i % columns == 0:
        print("")
    plt.imshow(img, cmap='gray')
plt.show()
6 7 4 4 0 3 0 7 3 1 0 1 3 1 1 0 0 8 4 6 5 7 9 1 0 3 0 7 2 1 1 0 2 9 0 2 5 1 3 2 7 9 8 4 9 4 5 9 3 4 0 5 5 8 3 6 6 0 0 6 8 8 3 1 4 7 0 2 9 4 8 7 3 9 4 4 3 3 4 4 8 6 8 7 4 0 4 4 4 1 4 2 7 4 2 9 1 9 1 0
C. Reshape all the images to an appropriate shape and update the data in the same variables
X_train.shape
(42000, 32, 32)
#Reshaping train and test data from 3d to 2d numpy array
X_train = X_train.reshape((X_train.shape[0], -1))
X_test = X_test.reshape((X_test.shape[0], -1))
X_test.shape
(18000, 1024)
print("X_train",X_train.shape," X_test", X_test.shape)
X_train (42000, 1024) X_test (18000, 1024)
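The flattening above works because each 32×32 image has 32 * 32 = 1024 pixels, and `reshape` with `-1` lets NumPy infer that dimension. A quick check on dummy data:

```python
import numpy as np

# Dummy batch of 4 grayscale 32x32 images
imgs = np.arange(4 * 32 * 32).reshape(4, 32, 32)

flat = imgs.reshape((imgs.shape[0], -1))  # -1 infers 32*32 = 1024
print(flat.shape)  # (4, 1024)

# Flattening is lossless: reshaping back recovers the original images
assert (flat.reshape(4, 32, 32) == imgs).all()
```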
D. Normalise the images i.e. Normalise the pixel values
# Convert the datasets from the .h5 file into numpy arrays
test_x = np.array(h5f['X_test'])
train_x = np.array(h5f['X_train'])
test_y = np.array(h5f['y_test'])
train_y = np.array(h5f['y_train'])
# Checking the type of each converted array
print(type(test_x))
print(type(train_x))
print(type(test_y))
print(type(train_y))
<class 'numpy.ndarray'> <class 'numpy.ndarray'> <class 'numpy.ndarray'> <class 'numpy.ndarray'>
# Use MinMaxScaler to normalise the pixel values (scale each column to the [0, 1] range)
from sklearn.preprocessing import MinMaxScaler
scale_minmax = MinMaxScaler()
train_x=scale_minmax.fit_transform(train_x.reshape(-1,train_x.shape[-1])).reshape(train_x.shape)
test_x=scale_minmax.transform(test_x.reshape(-1,test_x.shape[-1])).reshape(test_x.shape)
# Checking Lowest and Highest Pixel Value after normalization of images
print("Lowest Pixel Value:", train_x[0].min(),"Highest Pixel Value:", train_x[0].max())
Lowest Pixel Value: 0.049835496 Highest Pixel Value: 0.51063794
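Note that `MinMaxScaler` fits column-wise, which is why the first image's maximum above is only ~0.51: each pixel position is scaled by the range observed across all images at that position, not by a single global range. A simpler and more common normalisation for image data is dividing by the maximum pixel value; a sketch on dummy data, assuming 8-bit pixel intensities:

```python
import numpy as np

# Dummy 8-bit grayscale images
rng = np.random.default_rng(0)
imgs = rng.integers(0, 256, size=(10, 32, 32)).astype('float32')

# Global scaling: every pixel divided by the same constant,
# so relative intensities within each image are preserved
imgs_norm = imgs / 255.0

print(imgs_norm.min(), imgs_norm.max())
```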
2.E. Transform Labels into format acceptable by Neural Network
# Convert y data into categorical (one-hot encoding)
from tensorflow.keras.utils import to_categorical
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)
print('y_train',y_train.shape, 'y_test',y_test.shape)
y_train (42000, 10) y_test (18000, 10)
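`to_categorical` turns each integer label into a one-hot vector, which is why the label shape becomes (n_samples, 10). The same transform can be illustrated with plain NumPy:

```python
import numpy as np

labels = np.array([5, 0, 9, 3])
num_classes = 10

# np.eye(k) is the k x k identity matrix; indexing its rows with the
# labels picks out the matching one-hot vector for each label
one_hot = np.eye(num_classes)[labels]

print(one_hot.shape)        # (4, 10)
print(one_hot[0].argmax())  # 5 -- argmax recovers the original label
```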
2.F. Print total Number of classes in the Dataset
num_classes = len(np.unique(train_y))
print("Number of classes:", num_classes)
Number of classes: 10
train_y.shape
(42000,)
A. Design a Neural Network to train a classifier
train_x_flat = train_x.reshape(X_train.shape[0], -1)
test_x_flat = test_x.reshape(X_test.shape[0], -1)
test_x_flat.shape
(18000, 1024)
train_x_flat.shape
(42000, 1024)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras import optimizers

model_cls = Sequential()
model_cls.add(Dense(32, input_shape=(1024,)))
model_cls.add(Activation('relu'))
model_cls.add(Dense(16))
model_cls.add(Activation('relu'))
model_cls.add(Dense(16))
model_cls.add(Activation('relu'))
model_cls.add(Dense(16))
model_cls.add(Activation('relu'))
model_cls.add(Dense(10))
model_cls.add(Activation('softmax'))
sgd = optimizers.SGD(learning_rate = 0.001)
model_cls.compile(optimizer = sgd, loss = 'categorical_crossentropy', metrics = ['accuracy'])
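The trainable-parameter count of this architecture can be verified by hand: each Dense layer contributes `inputs * units` weights plus `units` biases, the same figure `model_cls.summary()` would report. A quick check:

```python
# (inputs, units) for each Dense layer in model_cls above
layers = [(1024, 32), (32, 16), (16, 16), (16, 16), (16, 10)]

# weights + biases per layer, summed over the network
total = sum(n_in + 1 for n_in, n_out in layers for _ in range(n_out))
total = sum(n_in * n_out + n_out for n_in, n_out in layers)
print(total)  # 34042 trainable parameters
```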
B. Train the classifier using previously designed Architecture (Use best suitable parameters).
hist_cl_hi5=model_cls.fit(train_x_flat,y_train,batch_size=20,epochs=50,verbose=0,validation_data=(test_x_flat,y_test))
C. Evaluate performance of the model with appropriate metrics.
result_cls = model_cls.evaluate(test_x_flat,y_test)
563/563 [==============================] - 1s 2ms/step - loss: 1.0585 - accuracy: 0.6723
result_cls_train_acc = model_cls.evaluate(train_x_flat,y_train)
1313/1313 [==============================] - 2s 2ms/step - loss: 1.0150 - accuracy: 0.6833
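The "563/563" and "1313/1313" counters in the evaluate output are simply batch counts: `model.evaluate` uses Keras's default batch size of 32, and the number of batches is ceil(n_samples / batch_size):

```python
import math

batch_size = 32  # Keras's default for model.evaluate

print(math.ceil(18000 / batch_size))  # 563 test batches
print(math.ceil(42000 / batch_size))  # 1313 train batches
```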
D. Plot the training loss, validation loss vs number of epochs and training accuracy, validation accuracy vs number of epochs plot and write your observations on the same
# Plot the training and validation loss
plt.plot(hist_cl_hi5.history['loss'])
plt.plot(hist_cl_hi5.history['val_loss'])
plt.title('Training Loss vs. Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend(['Training Loss', 'Validation Loss'])
plt.show()
# Plot the training and validation accuracy
plt.plot(hist_cl_hi5.history['accuracy'])
plt.plot(hist_cl_hi5.history['val_accuracy'])
plt.title('Training Accuracy vs. Validation Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend(['Training Accuracy', 'Validation Accuracy'])
plt.show()
hist_cl_hi5_v1=model_cls.fit(train_x_flat,y_train,batch_size=10,epochs=50,verbose=0,validation_data=(test_x_flat,y_test))
result_cls_v1 = model_cls.evaluate(test_x_flat,y_test)
563/563 [==============================] - 2s 4ms/step - loss: 0.8143 - accuracy: 0.7663
result_cls_train_acc_v1 = model_cls.evaluate(train_x_flat,y_train)
1313/1313 [==============================] - 4s 3ms/step - loss: 0.7142 - accuracy: 0.7878
# Plot the training and validation loss
plt.plot(hist_cl_hi5_v1.history['loss'])
plt.plot(hist_cl_hi5_v1.history['val_loss'])
plt.title('Training Loss vs. Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend(['Training Loss', 'Validation Loss'])
plt.show()
# Plot the training and validation accuracy
plt.plot(hist_cl_hi5_v1.history['accuracy'])
plt.plot(hist_cl_hi5_v1.history['val_accuracy'])
plt.title('Training Accuracy vs. Validation Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend(['Training Accuracy', 'Validation Accuracy'])
plt.show()
The model was created with 1 input layer, 4 hidden dense layers, and 1 output layer. The dense layers used relu as the activation function, and the output layer used softmax:
In the first training run (hist_cl_hi5), a batch size of 20 was used for 50 epochs. The test accuracy achieved was 67%, which is on par with the training accuracy of 68%.
In the second training run (hist_cl_hi5_v1), the batch size was reduced to 10, again for 50 epochs. The accuracy improved to about 76%, which is close to the training accuracy of 78%. However, this run shows more noise than the earlier one, so the earlier configuration is comparatively more stable. (Note also that the second `fit` call continues training the same `model_cls` rather than starting from fresh weights, which contributes to the higher accuracy.)
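Both runs train for a fixed 50 epochs regardless of when validation loss stops improving. A common remedy is Keras's `EarlyStopping` callback, e.g. `EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)` passed to `fit` via `callbacks=[...]`. The patience logic it applies can be sketched in plain Python on a hypothetical validation-loss curve:

```python
def early_stop_epoch(val_losses, patience=5):
    """Return the 0-based epoch at which training would stop, i.e. after
    `patience` consecutive epochs with no improvement in validation loss."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss   # new best: reset the patience counter
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses) - 1  # never triggered: all epochs ran

# Hypothetical val-loss curve that bottoms out early then drifts up
losses = [1.0, 0.9, 0.85, 0.86, 0.88, 0.90, 0.91, 0.93]
print(early_stop_epoch(losses, patience=3))  # stops at epoch 5
```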